On the Problem of Local Minima in Backpropagation

Authors

  • Marco Gori
  • Alberto Tesi
Abstract

Supervised learning in multilayered neural networks (MLNs) has recently been proposed through the well-known backpropagation algorithm. This is a gradient method that can get stuck in local minima, as simple examples show. In this paper, some conditions on the network architecture and the learning environment are proposed which ensure the convergence of the backpropagation algorithm. In particular, it is proven that convergence holds if the classes are linearly separable. In this case, the experience gained in several experiments shows that MLNs exceed perceptrons in generalization to new examples.
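
The claim that backpropagation, as a gradient method, can get stuck in local minima is easy to illustrate on a toy one-dimensional loss surface. The sketch below shows only the general phenomenon, not the paper's construction: the quartic function and the learning rate are arbitrary choices.

```python
# Toy nonconvex "error surface" with one global and one local minimum;
# a stand-in for an MLN cost function, not the paper's examples.
def loss(w):
    return w**4 - 3 * w**2 + w

def grad(w):
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w, lr=0.01, steps=2000):
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# One initialization reaches the global minimum (w* ~ -1.30),
# the other stalls in the shallower local minimum (w* ~ +1.13).
for w0 in (-2.0, 2.0):
    w = gradient_descent(w0)
    print(f"start {w0:+.1f} -> w* = {w:+.4f}, loss = {loss(w):+.4f}")
```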


Similar articles

Modified Error Function with Added Terms for the Backpropagation Algorithm

We have noted that the local minima problem in the backpropagation algorithm is usually caused by update disharmony between weights connected to the hidden layer and the output layer. To solve this problem, we propose a modified error function with added terms. By adding one term to the conventional error function, the modified error function can harmonize the update of weights connected to the...
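
A hedged sketch of the recipe described above: a conventional sum-of-squares error plus one added term. The snippet does not show the paper's actual term, so the penalty used here (a squared-hidden-activation term with coefficient lam) is purely a hypothetical placeholder for E_mod = E_conv + added term.

```python
import numpy as np

def modified_error(output, target, hidden, lam=0.1):
    # Conventional sum-of-squares error on the output layer.
    e_conv = 0.5 * np.sum((output - target) ** 2)
    # Hypothetical added term; the paper's actual term is not shown
    # in the truncated snippet above.
    e_add = lam * np.sum(hidden ** 2)
    return e_conv + e_add

out = np.array([0.8, 0.1])
tgt = np.array([1.0, 0.0])
hid = np.array([0.6, 0.4, 0.9])
print(modified_error(out, tgt, hid))  # 0.025 + 0.1 * 1.33 = 0.158
```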

Stochastics of on-line back-propagation

We study on-line backpropagation and show that the existing theoretical descriptions are strictly valid only on relatively short time scales or in the vicinity of (local) minima of the backpropagation error potential. Qualitative global features (e.g., why is it much easier to escape from local minima than from global minima) may also be explained by these local descriptions, but the current ap...
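
On-line backpropagation updates the weights after each pattern rather than after a full sweep, which makes the trajectory noisy; that noise is what is usually credited with making local minima easier to escape. Below is a minimal contrast of the two update schemes on a linear least-squares problem, a simplified setting chosen for illustration, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

def batch_step(w, lr=0.05):
    # Gradient averaged over the whole training set.
    return w - lr * (X.T @ (X @ w - y)) / len(y)

def online_step(w, i, lr=0.05):
    # Noisy gradient from a single pattern (on-line / stochastic update).
    return w - lr * (X[i] @ w - y[i]) * X[i]

w_b = np.zeros(3)
for _ in range(500):
    w_b = batch_step(w_b)

w_o = np.zeros(3)
for _ in range(50):
    for i in rng.permutation(len(y)):
        w_o = online_step(w_o, i)

print("batch:  ", np.round(w_b, 2))   # both approach w_true; the
print("on-line:", np.round(w_o, 2))   # on-line path fluctuates around it
```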

A new approach for data visualization problem

Data visualization is the process of transforming data, information, and knowledge into visual form, making use of humans' natural visual capabilities. It reveals relationships in data sets that are not evident from the raw data by using mathematical techniques to reduce the number of dimensions while preserving the relevant inherent properties. In this paper, we formulated d...
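
As one concrete instance of reducing the number of dimensions while preserving inherent structure, here is a minimal PCA projection. PCA is a classical baseline chosen for illustration; the paper's own formulation is not shown in the truncated snippet.

```python
import numpy as np

def pca_project(X, k=2):
    # Center the data, then project onto the top-k principal directions.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

X = np.random.default_rng(1).normal(size=(200, 10))
print(pca_project(X).shape)  # (200, 2): ready for a 2-D scatter plot
```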

A Modified Error Function for the Complex-valued Backpropagation Neural Networks

The complex-valued backpropagation algorithm has been widely used in fields dealing with telecommunications, speech recognition, and image processing with Fourier transformation. However, the local minima problem usually occurs in the process of learning. To solve this problem and to speed up the learning process, we propose a modified error function. We added a term to the conventional error f...
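
For concreteness, here is a minimal sketch of one complex-valued neuron using a "split" activation (a real sigmoid applied to the real and imaginary parts separately), which is one common choice in complex-valued backpropagation. The paper's exact network and its added error term are not shown in the truncated snippet, so treat this only as an illustration of the setting.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def complex_neuron(x, w, b):
    # Complex weighted sum, then a split activation on Re and Im parts.
    z = np.dot(w, x) + b
    return sigmoid(z.real) + 1j * sigmoid(z.imag)

x = np.array([0.5 + 0.2j, -0.3 + 0.8j])   # complex inputs
w = np.array([0.1 - 0.4j, 0.7 + 0.2j])    # complex weights
print(complex_neuron(x, w, 0.05 + 0.05j))
```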

Suspiciousness of Loading Problems

We introduce the notion of suspect families of loading problems in an attempt to formalize situations in which classical learning algorithms based on local optimization are likely to fail (because of local minima or numerical precision problems). We show that any loading problem belonging to a non-suspect family can be solved with optimal complexity by a canonical form of gradient descent wi...

Journal:
  • IEEE Trans. Pattern Anal. Mach. Intell.

Volume 14, Issue

Pages -

Publication date: 1992